9 research outputs found

    Big Data Framework Using Spark Architecture for Dose Optimization Based on Deep Learning in Medical Imaging

    Get PDF
    Deep learning and machine learning provide consistent tools and powerful functions for recognition, classification, reconstruction, noise reduction, quantification, and segmentation in biomedical image analysis, and have produced several breakthroughs. Recently, applications of deep learning and machine learning to low-dose optimization in computed tomography have been developed. Because of advances in reconstruction and processing technology, it has become crucial to develop architectures and/or methods based on deep learning algorithms to minimize the radiation delivered during computed tomography scans. This chapter extends the work of Alla et al. (2020). It introduces deep learning for computed tomography low-dose optimization, shows examples described in the literature, briefly discusses new methods for computed tomography image processing, and provides conclusions. We propose a pipeline for low-dose computed tomography image reconstruction based on the literature. Our proposed pipeline relies on deep learning and big data technology using the Spark framework. We compare it with pipelines proposed in the literature in order to establish its efficiency and relevance. A big data architecture using computed tomography images for low-dose optimization is proposed. The proposed architecture relies on deep learning and allows us to develop effective and appropriate methods for dose optimization with computed tomography images. The implementation of the image denoising pipeline shows that the radiation dose can be reduced and that the recommended pipeline improves the quality of the captured images
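
    A minimal sketch of how such a Spark-based denoising stage might be wired together, assuming a pretrained deep-learning denoiser wrapped as a function and CT slices stored as NumPy arrays (the file layout, the denoise_slice placeholder, and the statistics step are illustrative assumptions, not the chapter's actual pipeline):

        # Sketch: distributing CT slice denoising over a Spark cluster
        # (paths and the identity "denoiser" are placeholders, not the
        # chapter's pipeline).
        import glob
        import numpy as np
        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("lowdose-ct-denoising").getOrCreate()
        sc = spark.sparkContext

        def denoise_slice(pixels: np.ndarray) -> np.ndarray:
            # A real pipeline would load trained CNN weights once per executor
            # and run inference here; identity keeps the sketch runnable.
            return pixels

        paths = glob.glob("ct_slices/*.npy")      # hypothetical input layout
        denoised = sc.parallelize(paths).map(np.load).map(denoise_slice)
        # Report a cheap per-slice statistic rather than shipping volumes back.
        print(denoised.map(lambda s: float(s.mean())).collect())

    Each slice is processed independently, which is what makes a map-style Spark distribution a natural fit for slice-wise denoising.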

    Polynomial modelling of ECG signals with applications to data compression

    No full text
    Developing new ECG data compression methods has become more important with the implementation of telemedicine. In fact, compression schemes can considerably reduce the cost of medical data transmission over modern telecommunication networks. Our aim in this thesis is to elaborate compression algorithms for ECG data using orthogonal polynomials. To start, we studied the physiological origin of the ECG, analysed its signal patterns, including the characteristic waves, and reviewed signal processing procedures generally applied to the ECG. We also made an exhaustive, comparative review of existing ECG data compression algorithms, with special emphasis on methods based on polynomial approximation and interpolation. We next dealt with the theoretical foundations of orthogonal polynomials, studying their mathematical construction, their many interesting properties, and the characteristics of some particular families. The modelling of ECG signals with orthogonal polynomials has two stages. First, the ECG signal is divided into blocks after QRS detection; these blocks must match cardiac cycles. Second, the blocks are decomposed in polynomial bases; the decomposition yields coefficients that are used to synthesize the reconstructed signal. Compression amounts to using a small number of coefficients to represent a block made of a large number of signal samples. Our experiments with decompositions in several orthogonal polynomial bases established that Laguerre polynomials and Hermite polynomials do not lead to good signal reconstruction, whereas Legendre polynomials and Chebyshev polynomials give interesting results. Consequently, our first ECG compression algorithm was designed using Jacobi polynomials. When this algorithm is optimized by suppressing boundary effects, it becomes universal and can be used to compress other types of signals, such as audio and images. Although neither Laguerre polynomials nor Hermite functions individually allow a good modelling of ECG signal segments, we combined the two systems of functions to represent a cardiac cycle. In this scheme, the ECG segment corresponding to a cardiac cycle is split into two parts: the isoelectric baseline, which is decomposed in a series of Laguerre polynomials, and the P-QRS-T waves, which are modelled with Hermite functions. The resulting second ECG compression algorithm is robust and very competitive
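
    As a minimal illustration of the block-wise decomposition described above (a sketch of the general technique, not the thesis code; the toy block, sampling rate, and order are assumptions), one cardiac-cycle block can be fitted with a truncated Legendre series and reconstructed from the retained coefficients:

        # Sketch: compressing one ECG block with a truncated Legendre series.
        import numpy as np
        from numpy.polynomial import legendre as L

        fs = 360                                  # assumed sampling rate, Hz
        t = np.linspace(-1, 1, fs)                # one block mapped onto [-1, 1]
        block = np.exp(-8 * t**2) - 0.3 * np.exp(-8 * (t - 0.3)**2)  # toy cycle

        order = 20                                # 21 coefficients for 360 samples
        coeffs = L.legfit(t, block, order)        # decomposition (least squares)
        reconstructed = L.legval(t, coeffs)       # synthesis

        cr = block.size / coeffs.size             # compression ratio, about 17:1
        prd = 100 * np.linalg.norm(block - reconstructed) / np.linalg.norm(block)
        print(f"CR = {cr:.1f}:1, PRD = {prd:.2f}%")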

    Modélisations polynomiales des signaux ECG. Application à la compression.

    No full text
    Developing new ECG data compression methods has become more important with the implementation of telemedicine. In fact, compression schemes can considerably reduce the cost of medical data transmission over modern telecommunication networks. Our aim in this thesis is to elaborate compression algorithms for ECG data using orthogonal polynomials. To start, we studied the physiological origin of the ECG, analysed its signal patterns, including the characteristic waves, and reviewed signal processing procedures generally applied to the ECG. We also made an exhaustive, comparative review of existing ECG data compression algorithms, with special emphasis on methods based on polynomial approximation and interpolation. We next dealt with the theoretical foundations of orthogonal polynomials, studying their mathematical construction, their many interesting properties, and the characteristics of some particular families. The modelling of ECG signals with orthogonal polynomials has two stages. First, the ECG signal is divided into blocks after QRS detection; these blocks must match cardiac cycles. Second, the blocks are decomposed in polynomial bases; the decomposition yields coefficients that are used to synthesize the reconstructed signal. Compression amounts to using a small number of coefficients to represent a block made of a large number of signal samples. Our experiments with decompositions in several orthogonal polynomial bases established that Laguerre polynomials and Hermite polynomials do not lead to good signal reconstruction, whereas Legendre polynomials and Chebyshev polynomials give interesting results. Consequently, our first ECG compression algorithm was designed using Jacobi polynomials. When this algorithm is optimized by suppressing boundary effects, it becomes universal and can be used to compress other types of signals, such as audio and images. Although neither Laguerre polynomials nor Hermite functions individually allow a good modelling of ECG signal segments, we combined the two systems of functions to represent a cardiac cycle. In this scheme, the ECG segment corresponding to a cardiac cycle is split into two parts: the isoelectric baseline, which is decomposed in a series of Laguerre polynomials, and the P-QRS-T waves, which are modelled with Hermite functions. The resulting second ECG compression algorithm is robust and very competitive
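
    The second algorithm pairs Laguerre polynomials for the baseline with Hermite functions for the P-QRS-T waves. As a hedged sketch of the Hermite side only (the toy wave and basis size are assumptions), the first few orthonormal Hermite functions can represent a localized QRS-like shape with a handful of coefficients:

        # Sketch: projecting a QRS-like wave onto orthonormal Hermite functions.
        import math
        import numpy as np
        from numpy.polynomial import hermite as H

        def hermite_function(n: int, t: np.ndarray) -> np.ndarray:
            # h_n(t) = H_n(t) exp(-t^2/2) / sqrt(2^n n! sqrt(pi))
            c = np.zeros(n + 1)
            c[n] = 1.0
            norm = math.sqrt(2.0**n * math.factorial(n) * math.sqrt(math.pi))
            return H.hermval(t, c) * np.exp(-t**2 / 2) / norm

        t = np.linspace(-5, 5, 500)
        dt = t[1] - t[0]
        wave = (1 - t**2) * np.exp(-t**2 / 2)     # toy QRS-shaped wave

        n_basis = 6
        basis = np.stack([hermite_function(n, t) for n in range(n_basis)])
        coeffs = basis @ wave * dt                # inner products (projection)
        approx = coeffs @ basis                   # synthesis from 6 coefficients

        err = np.linalg.norm(wave - approx) / np.linalg.norm(wave)
        print(f"relative error with {n_basis} coefficients: {err:.4f}")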

    Modélisations polynomiales des signaux ECG : applications à la compression

    No full text
    Developing new ECG data compression methods has become more important with the implementation of telemedicine. In fact, compression schemes can considerably reduce the cost of medical data transmission over modern telecommunication networks. Our aim in this thesis is to elaborate compression algorithms for ECG data using orthogonal polynomials. To start, we studied the physiological origin of the ECG, analysed its signal patterns, including the characteristic waves, and reviewed signal processing procedures generally applied to the ECG. We also made an exhaustive, comparative review of existing ECG data compression algorithms, with special emphasis on methods based on polynomial approximation and interpolation. We next dealt with the theoretical foundations of orthogonal polynomials, studying their mathematical construction, their many interesting properties, and the characteristics of some particular families. The modelling of ECG signals with orthogonal polynomials has two stages. First, the ECG signal is divided into blocks after QRS detection; these blocks must match cardiac cycles. Second, the blocks are decomposed in polynomial bases; the decomposition yields coefficients that are used to synthesize the reconstructed signal. Compression amounts to using a small number of coefficients to represent a block made of a large number of signal samples. Our experiments with decompositions in several orthogonal polynomial bases established that Laguerre polynomials and Hermite polynomials do not lead to good signal reconstruction, whereas Legendre polynomials and Chebyshev polynomials give interesting results. Consequently, our first ECG compression algorithm was designed using Jacobi polynomials. When this algorithm is optimized by suppressing boundary effects, it becomes universal and can be used to compress other types of signals, such as audio and images. Although neither Laguerre polynomials nor Hermite functions individually allow a good modelling of ECG signal segments, we combined the two systems of functions to represent a cardiac cycle. In this scheme, the ECG segment corresponding to a cardiac cycle is split into two parts: the isoelectric baseline, which is decomposed in a series of Laguerre polynomials, and the P-QRS-T waves, which are modelled with Hermite functions. The resulting second ECG compression algorithm is robust and very competitive
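
    The first algorithm above is built on Jacobi polynomials. A minimal least-squares sketch in a Jacobi basis (SciPy's eval_jacobi stands in for the thesis implementation; the parameters alpha and beta, the toy block, and the order are assumptions):

        # Sketch: fitting one signal block in a Jacobi polynomial basis.
        import numpy as np
        from scipy.special import eval_jacobi

        alpha, beta = 0.5, 0.5                    # assumed Jacobi parameters
        order = 15
        t = np.linspace(-1, 1, 300)
        block = np.sin(3 * np.pi * t) * np.exp(-2 * t**2)   # toy block

        # Design matrix whose columns are P_n^(alpha,beta)(t), n = 0..order.
        A = np.stack([eval_jacobi(n, alpha, beta, t) for n in range(order + 1)],
                     axis=1)
        coeffs, *_ = np.linalg.lstsq(A, block, rcond=None)
        reconstructed = A @ coeffs

        prd = 100 * np.linalg.norm(block - reconstructed) / np.linalg.norm(block)
        print(f"{order + 1} coefficients for {block.size} samples, PRD = {prd:.2f}%")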

    Towards an automated medical diagnosis system for intestinal parasitosis

    No full text
    Human parasites are a real public health problem in tropical countries, especially in underdeveloped countries. Usually, the medical diagnosis of intestinal parasites is carried out in the laboratory by visual analysis of stool samples under an optical microscope, and parasite recognition is performed by comparing each parasite's shape with known forms. We offer a solution that automates the diagnosis of intestinal parasites from images obtained with a microscope connected directly to a computer. Our approach exploits contour detection based on the multi-scale wavelet transform to detect the parasite. Active contours are combined with the Hough transform to perform image segmentation and extraction of the parasite. We used principal component analysis for the extraction and reduction of features obtained directly from the pixels of the extracted parasite image. Our classification tool is based on the probabilistic neural network. The resulting algorithms were tested on 900 microscopic image samples covering 15 different species of intestinal parasites, achieving a 100% recognition rate. Keywords: Intestinal parasites, Wavelet edge detector, Hough transform, Active contours, Probabilistic neural network
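
    A compact sketch of two of the building blocks named above, circle localization with the Hough transform and PCA feature reduction (OpenCV and scikit-learn stand in for the authors' implementation; the file name, Hough parameters, and patch size are assumptions):

        # Sketch: Hough-based candidate detection + PCA feature reduction.
        import cv2
        import numpy as np
        from sklearn.decomposition import PCA

        img = cv2.imread("stool_sample.png", cv2.IMREAD_GRAYSCALE)  # assumed file
        blurred = cv2.medianBlur(img, 5)

        # Circular Hough transform to localize roughly round parasite candidates.
        circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.2, minDist=40,
                                   param1=100, param2=30, minRadius=10, maxRadius=60)

        patches = []
        if circles is not None:
            for x, y, r in np.round(circles[0]).astype(int):
                patch = blurred[max(y - r, 0):y + r, max(x - r, 0):x + r]
                patches.append(cv2.resize(patch, (32, 32)).ravel())  # 1024 features

        # PCA compresses raw pixel features before the neural-network classifier.
        if len(patches) > 1:
            pca = PCA(n_components=min(10, len(patches) - 1))
            print(pca.fit_transform(np.array(patches)).shape)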

    An optimal big data workflow for biomedical image analysis

    No full text
    Background and objective: In the medical field, data volume is growing continuously, and traditional methods cannot manage it efficiently. The persistent challenges in biomedical computation are the management, analysis, and storage of biomedical data. Nowadays, big data technology plays a significant role in the management, organization, and analysis of data, using machine learning and artificial intelligence techniques; it also allows quick access to data through NoSQL databases. Big data technologies thus include new frameworks for processing medical data such as biomedical images. It has become very important to develop methods and/or architectures based on big data technologies for complete processing of biomedical image data. Method: This paper describes big data analytics for biomedical images, shows examples reported in the literature, briefly discusses new processing methods, and offers conclusions. We argue for adapting and extending related methods using the Hadoop and Spark big data frameworks, which provide an optimal and efficient architecture for biomedical image analysis. The paper thus gives a broad overview of big data analytics for automating biomedical image diagnosis, and proposes a workflow with optimal methods and algorithms for each step. Results: Two architectures for image classification are suggested: the first is designed with the Hadoop framework and the second with the Spark framework. The proposed Spark architecture allows us to develop appropriate and efficient methods to leverage a large number of images for classification, and each step of the workflow can be customized. Conclusions: The proposed architectures are more complete, simpler, and adaptable at every step of the design. The Spark architecture is the most complete, because its embedded libraries facilitate the implementation of algorithms. Keywords: Biomedical images, Big data, Artificial intelligence, Machine learning, Hadoop/Spark
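
    As a hedged sketch of the Spark side of such a workflow (the CSV of precomputed image features with a label column is an assumption, not the paper's pipeline):

        # Sketch: Spark ML classification of precomputed biomedical image features.
        from pyspark.sql import SparkSession
        from pyspark.ml import Pipeline
        from pyspark.ml.feature import VectorAssembler
        from pyspark.ml.classification import LogisticRegression

        spark = SparkSession.builder.appName("biomedical-classification").getOrCreate()

        # Assumed layout: one numeric "label" column plus numeric feature columns.
        df = spark.read.csv("image_features.csv", header=True, inferSchema=True)
        feature_cols = [c for c in df.columns if c != "label"]

        pipeline = Pipeline(stages=[
            VectorAssembler(inputCols=feature_cols, outputCol="features"),
            LogisticRegression(labelCol="label", featuresCol="features"),
        ])

        train, test = df.randomSplit([0.8, 0.2], seed=42)
        model = pipeline.fit(train)
        correct = model.transform(test).where("prediction = label").count()
        print(f"test accuracy: {correct / test.count():.3f}")

    The embedded pyspark.ml library is what makes this architecture easy to extend: swapping the classifier stage is a one-line change.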

    Automating the clinical stools exam using image processing integrated in an expert system

    No full text
    Background and objective: The diagnosis of intestinal parasitosis relies on physiological symptoms and stool examination. Often, few specialists are available, and the manual stool exam is slow, prone to error, and can cause eye fatigue. Our aim was to design and implement an automated medical expert system to help diagnose human intestinal parasitosis. Methods: The system was developed around a decision algorithm. A knowledge base was constructed from information about the disease gleaned from books and physicians. The user interacts with the system by answering questions. The symptom information collected leads to a microscopic examination of stools, which is run on the system to detect parasites. The paradigm for automated microscopic stool examination combines distance regularized level set evolution, automatically initialized by a circular Hough transform, with a trained neuro-fuzzy classifier. The neuro-fuzzy classifier was trained to analyse twenty human intestinal parasites. Results: We combined the diagnostic reasoning scheme and the automated clinical stool exam in the same system. The parasites found in the microscopic imagery confirm the suspected disease, after which the final diagnosis is completed with an appropriate proposed therapy. The system was evaluated on sixty cases of infection and compared to the diagnoses of two expert doctors; we obtained fifty-eight correct diagnoses, corresponding to 96.6% accuracy. Conclusions: The proposed system is automated, since the parameters of segmentation, feature extraction, and classification are computationally guided by the type of suspected parasite. The system is potentially an important contribution to medical healthcare assistance
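
    A toy sketch of the question-driven rule layer such an expert system uses (the symptoms and rules are invented for illustration and are not the system's medical knowledge base):

        # Toy sketch of rule-based reasoning: ask about symptoms, fire matching rules.
        RULES = [
            ({"diarrhea", "abdominal_pain"}, "suspect intestinal parasitosis; run stool exam"),
            ({"anemia", "fatigue"}, "suspect hookworm infection; run stool exam"),
        ]

        def ask(symptom):
            answer = input(f"Does the patient have {symptom.replace('_', ' ')}? [y/n] ")
            return answer.strip().lower() == "y"

        def diagnose():
            symptoms = {s for antecedent, _ in RULES for s in antecedent}
            present = {s for s in sorted(symptoms) if ask(s)}
            # A rule fires when all of its antecedent symptoms are present.
            return [advice for antecedent, advice in RULES if antecedent <= present]

        if __name__ == "__main__":
            for line in diagnose() or ["no rule fired; refer to a specialist"]:
                print(line)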

    Biomedical Image Classification in a Big Data Architecture Using Machine Learning Algorithms

    No full text
    In modern-day medicine, medical imaging has undergone immense advancement and can capture many biomedical images from patients. To assist medical specialists, these images can be used to train an intelligent system that helps determine the diseases identifiable from them. Classification plays an important role in this regard: it groups these images into disease categories and optimizes the next step of a computer-aided diagnosis system. In machine learning, classification is the problem of identifying to which of a set of categories a new observation belongs, on the basis of a training set of observations whose category membership is known. The goal of this paper is to survey classification algorithms for biomedical images. The paper then describes how these algorithms can be applied in a big data architecture using the Spark framework. It further proposes a classification workflow based on the algorithms observed in the literature to perform best, Support Vector Machines and deep learning. An algorithm for the feature extraction step of the classification process is presented; it and all other steps of the proposed classification workflow can be customized
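
    A compact sketch of the SVM stage of such a workflow (scikit-learn on its bundled digits images stands in for biomedical data and for the Spark deployment; the kernel and C are illustrative choices):

        # Sketch: SVM classification of small images as a stand-in for the workflow.
        from sklearn.datasets import load_digits
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC

        X, y = load_digits(return_X_y=True)       # 8x8 images, 64 pixel features
        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.25, random_state=0)

        clf = SVC(kernel="rbf", C=10, gamma="scale")  # RBF SVM, a common default
        clf.fit(X_train, y_train)
        print(f"test accuracy: {clf.score(X_test, y_test):.3f}")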